This week I wrote, edited, and finalized my final report for DREU and my Legacy Glossary for my successors in the lab.
This day I wrote the first draft of my final report and sent it off to my director for feedback.
Afterward, I stopped training SqueezeNet at 87 epochs; the model had not improved since epoch 49 (reaching 1.24% accuracy), and it was unlikely to get better in the next three epochs. I attempted to run the best saved model on the lab's initial setup but discovered it wouldn't run. I made some temporary changes to trainval_net to see if I could get the model to run at all, but I hit an insurmountable issue: the pretrained model is not a Faster R-CNN, and is therefore incompatible with the lab's other work. A shame.
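For anyone curious, the incompatibility comes down to the checkpoints themselves: an ImageNet-style classifier checkpoint does not contain the region-proposal and detection-head parameters that a Faster R-CNN training script expects, so loading it fails outright. Below is a minimal sketch of that mismatch using torchvision models as stand-ins; the lab's actual trainval_net and checkpoints differ, so treat this as illustrative only.

```python
from torchvision.models import squeezenet1_1
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Stand-ins for the real checkpoints: a plain ImageNet-style classifier
# versus a Faster R-CNN detector (both randomly initialized here).
classifier = squeezenet1_1(weights=None)
detector = fasterrcnn_resnet50_fpn(weights=None)

clf_keys = set(classifier.state_dict().keys())
det_keys = set(detector.state_dict().keys())

# The detector's state_dict expects backbone, RPN, and RoI-head parameters
# that a classifier checkpoint does not contain, so the two barely overlap.
print(f"classifier params: {len(clf_keys)}, detector params: {len(det_keys)}")
print(f"shared parameter names: {len(clf_keys & det_keys)}")

# Loading the classifier weights into the detector fails for the same reason.
try:
    detector.load_state_dict(classifier.state_dict(), strict=True)
except RuntimeError as err:
    print("incompatible checkpoint:", str(err)[:120], "...")
```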
This day I added to my Legacy Glossary for the lab, where I detail everything I learned and leave helpful information about my work for those who will come after me.
This day I finalized my DREU paper.
This day I finalized my Legacy Glossary.
This day I attempted to get the ImageNet-trained model to run with the lab's Faster R-CNN scripts despite it not being a Faster R-CNN model. It did not succeed. A cynical end to this experience, admittedly. I said goodbye to my coworker and director, with the hope of seeing them again if I come to Brown for a post-grad degree.